
    Separation & Recovery of Cu and Zn by Solvent Extraction - Electrowinning from the Sulphate Leach Liquor of Complex Sulphide Ore

    Complex sulphide ores containing Cu, Zn, and Pb form an important source of base metals in India. The sulphides are first floated out and the bulk sulphide concentrates are subjected to roasting in a fluidised bed reactor, where the metal sulphides are converted to soluble sulphates. The roasted mass is leached with water, whereby copper and zinc are solubilised, leaving lead in the insoluble residue. The solution is processed further by solvent extraction to extract copper quantitatively using the commercial extractant LIX 64N. The zinc remains unextracted and can be recovered by electrowinning. This paper describes the process developed and the results obtained on the recovery of electrolytic-grade copper and zinc.
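
    As a rough illustration of the flowsheet described above, the sketch below models the main separation steps (sulphation roast and water leach, selective Cu extraction with LIX 64N, electrowinning of Cu and Zn) as a simple mass balance in Python. All feed grades and split fractions are hypothetical placeholders, not figures from the paper.

        # Illustrative sketch only: tracks where Cu, Zn and Pb report in the
        # flowsheet described above. All numbers are hypothetical placeholders.

        def roast_and_leach(feed):
            """Sulphation roast + water leach: Cu and Zn dissolve, Pb stays in the residue."""
            liquor = {"Cu": feed["Cu"], "Zn": feed["Zn"]}
            residue = {"Pb": feed["Pb"]}
            return liquor, residue

        def solvent_extraction(liquor, cu_extraction=0.99):
            """LIX 64N loads Cu selectively; Zn stays in the raffinate."""
            organic = {"Cu": liquor["Cu"] * cu_extraction}
            raffinate = {"Cu": liquor["Cu"] * (1 - cu_extraction), "Zn": liquor["Zn"]}
            return organic, raffinate

        def electrowin(solution, metal, current_efficiency=0.95):
            """Cathode deposit recovered from the strip liquor (Cu) or the raffinate (Zn)."""
            return {metal: solution[metal] * current_efficiency}

        feed = {"Cu": 10.0, "Zn": 25.0, "Pb": 5.0}   # kg per tonne of concentrate, hypothetical
        liquor, residue = roast_and_leach(feed)
        organic, raffinate = solvent_extraction(liquor)
        print(electrowin(organic, "Cu"), electrowin(raffinate, "Zn"), residue)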

    A model problem for conformal parameterizations of the Einstein constraint equations

    We investigate the possibility that the conformal and conformal thin sandwich (CTS) methods can be used to parameterize the set of solutions of the vacuum Einstein constraint equations. To this end we develop a model problem obtained by taking the quotient of certain symmetric data on conformally flat tori. Specializing the model problem to a three-parameter family of conformal data, we observe a number of new phenomena for the conformal and CTS methods. Within this family, we obtain a general existence theorem so long as the mean curvature does not change sign. When the mean curvature changes sign, we find that for certain data, solutions exist if and only if the transverse-traceless tensor is sufficiently small. When such solutions exist, there is generically more than one. Moreover, the theory for mean curvatures changing sign is shown to be extremely sensitive to the value of a coupling constant in the Einstein constraint equations. Comment: 40 pages, 4 figures.
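
    For reference, the conformal method recasts the vacuum constraints as a coupled system for a conformal factor phi and a vector field W on a background metric lambda. The standard form is sketched below; this is the textbook Lichnerowicz-York system (sign and scaling conventions vary between authors), not a formula quoted from this paper.

        \begin{align*}
          -8\,\Delta_{\lambda}\phi + R_{\lambda}\,\phi
              &= -\tfrac{2}{3}\,\tau^{2}\,\phi^{5}
                 + \bigl|\sigma + \mathbb{L}W\bigr|_{\lambda}^{2}\,\phi^{-7}, \\
          \operatorname{div}_{\lambda}\bigl(\mathbb{L}W\bigr)
              &= \tfrac{2}{3}\,\phi^{6}\,\mathrm{d}\tau ,
        \end{align*}

    Here tau is the mean curvature, sigma the transverse-traceless tensor, and \mathbb{L} the conformal Killing operator. When tau is constant the right-hand side of the momentum constraint vanishes and the system decouples; the non-constant, sign-changing mean curvature regime studied in the abstract is precisely where the coupling between the two equations becomes delicate.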

    A process pattern model for tackling and improving big data quality

    Data seldom create value by themselves. They need to be linked and combined from multiple sources, which often come with variable data quality. The task of improving data quality is a recurring challenge. In this paper, we use a case study of a large telecom company to develop a generic process pattern model for improving data quality. The process pattern model is defined as a proven series of activities, aimed at improving data quality given a certain context, a particular objective, and a specific set of initial conditions. Four different patterns are derived to deal with variations in the data quality of datasets. Instead of having to find a way to improve the quality of big data for each situation, the process model provides data users with generic patterns, which can be used as a reference model to improve big data quality.
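
    To make the notion concrete, the sketch below encodes a process pattern as a small Python structure holding the context, objective, initial conditions and ordered activities, and applies it to a toy dataset. The pattern name, activities and data are hypothetical illustrations, not the four patterns derived in the paper.

        # Illustrative sketch only: a generic "process pattern" as described above.
        from dataclasses import dataclass, field
        from typing import Callable, List

        @dataclass
        class ProcessPattern:
            name: str
            context: str                  # situation the pattern applies to
            objective: str                # quality goal to reach
            initial_conditions: List[str]
            activities: List[Callable[[list], list]] = field(default_factory=list)

            def apply(self, dataset: list) -> list:
                """Run the activities in order, as a reference workflow."""
                for activity in self.activities:
                    dataset = activity(dataset)
                return dataset

        # Hypothetical activities: drop empty records, then deduplicate.
        drop_empty = lambda rows: [r for r in rows if any(r.values())]
        dedupe = lambda rows: list({tuple(sorted(r.items())): r for r in rows}.values())

        pattern = ProcessPattern(
            name="cleanse-and-merge",
            context="customer data combined from several billing systems",
            objective="a single, consistent customer view",
            initial_conditions=["schemas mapped", "record keys identified"],
            activities=[drop_empty, dedupe],
        )
        print(pattern.apply([{"id": 1, "name": "A"}, {"id": 1, "name": "A"}, {"id": None, "name": ""}]))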

    Effects of chronic inflammatory bowel diseases on left ventricular structure and function: a study protocol

    BACKGROUND: Experimental evidence suggests increased collagen deposition in inflammatory bowel diseases (IBD). In particular, large amounts of collagen types I, III and V have been described and correlated with the development of intestinal fibrotic lesions. No information has been available until now about possible increased collagen deposition far from the main target organ. On the hypothesis that chronic inflammation and increased collagen metabolism are also reflected in the systemic circulation, we designed this study to evaluate the effects on left ventricular wall structure by assessing splanchnic and systemic collagen metabolism (procollagen III assay), collagen deposition (ultrasonic tissue characterization), and cardiac function (echocardiography) in patients with long-standing histories of IBD of differing duration, before and after surgery. METHODS: Thirty patients affected by active IBD, 15 with Crohn's disease and 15 with ulcerative colitis, scheduled for surgery will be enrolled in the study in a double-blind fashion. They will be studied before the surgical operation and 6 and 12 months after surgery. A control group of 15 healthy age- and gender-matched subjects will also be studied. At each interval, blood samples will be collected to assess collagen metabolism, and a transthoracic echocardiogram will be recorded for the subsequent determination of cardiac function and collagen deposition. DISCUSSION: From this study protocol we expect additional information about the association between IBD and cardiovascular disorders, and in particular to address the question of whether chronic inflammation, through altered collagen metabolism, could affect left ventricular structure and function in a manner directly related to the estimated duration of the disease.

    Feasibility study of computed tomography colonography using limited bowel preparation at normal and low-dose levels

    The purpose was to evaluate low-dose CT colonography without cathartic cleansing in terms of image quality, polyp visualization and patient acceptance. Sixty-one patients scheduled for colonoscopy started a low-fiber diet, lactulose and amidotrizoic acid for fecal tagging 2 days prior to the CT scan (standard dose, 5.8–8.2 mSv). The original raw data of 51 patients were modified and reconstructed at simulated 2.3 and 0.7 mSv levels. Two observers evaluated the standard-dose scan regarding image quality and polyps. A third evaluated the presence of polyps at all three dose levels in a blinded prospective way. All observers were blinded to the reference standard, colonoscopy. Patients were given questionnaires about their experiences and preference at three time points. Image quality was sufficient in all patients, but significantly lower in the cecum, sigmoid and rectum. The two observers correctly identified 10/15 (67%) and 9/15 (60%) of polyps ≥10 mm, respectively, with 5 and 8 false-positive lesions (standard-dose scan). Dose reduction down to 0.7 mSv was not associated with significant changes in diagnostic value (polyps ≥10 mm). Eighty percent of patients preferred CT colonography and 13% preferred colonoscopy (P<0.001). CT colonography without cleansing is preferred to colonoscopy and shows sufficient image quality and moderate sensitivity, without impaired diagnostic value at dose levels as low as 0.7 mSv.

    Visualizing Big Data with augmented and virtual reality: challenges and research agenda

    This paper provides a multi-disciplinary overview of the research issues and achievements in the field of Big Data and its visualization techniques and tools. The main aim is to summarize the challenges in visualization methods for existing Big Data, as well as to offer novel solutions for issues related to the current state of Big Data visualization. This paper provides a classification of existing data types, analytical methods, visualization techniques and tools, with particular emphasis on surveying the evolution of visualization methodology over the past years. Based on the results, we reveal disadvantages of existing visualization methods. Despite the technological development of the modern world, human involvement (interaction), judgment and logical thinking remain necessary when working with Big Data. Therefore, the role of human perceptual limitations when handling large amounts of information is evaluated. Based on the results, a non-traditional approach is proposed: we discuss how the capabilities of Augmented Reality and Virtual Reality could be applied to the field of Big Data visualization. We discuss the promising utility of integrating Mixed Reality technology with applications in Big Data visualization. Placing the most essential data in the central area of the human visual field in Mixed Reality would allow one to take in the presented information in a short period of time without significant data loss due to human perceptual issues. Furthermore, we discuss the impact of new technologies, such as Virtual Reality displays and Augmented Reality helmets, on Big Data visualization, as well as the classification of the main challenges of integrating these technologies.

    Normal parameter reduction algorithm in soft set based on hybrid binary particle swarm and biogeography optimizer

    Existing classification techniques previously proposed for eliminating data inconsistency cannot achieve efficient parameter reduction in soft set theory, which affects the obtained decisions. Meanwhile, the computational cost of the combination-generation process for soft sets can grow beyond practical limits, a problem of nondeterministic polynomial (NP) time complexity. The contributions of this study are mainly focused on minimizing the cost of choices by adjusting the original classifications through a decision partition order, and on enhancing the probability of searching the domain space using a developed Markov chain model. Furthermore, this study introduces an efficient soft-set reduction algorithm based on binary particle swarm optimization hybridized with the biogeography-based optimizer (SSR-BPSO-BBO) that generates an accurate decision for optimal and sub-optimal choices. The results show that the decision partition order technique performs better in parameter reduction, achieving up to 50%, while other algorithms could not obtain high reduction rates in some scenarios. In terms of accuracy, the proposed SSR-BPSO-BBO algorithm outperforms the other optimization algorithms, achieving a high accuracy percentage on a given soft dataset. In addition, the proposed Markov chain model can represent the robustness of the parameter reduction technique in obtaining the optimal decision while minimizing the search domain.
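
    As a rough illustration of the optimization side, the sketch below runs a plain binary PSO over subsets of soft-set parameters, rewarding reductions that preserve the decision ranking given by the choice values. It is a generic binary PSO with a simple fitness function on a random toy soft set; it omits the paper's biogeography-based operators, decision partition order and Markov chain model.

        # Illustrative sketch only: binary PSO for soft-set parameter reduction.
        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical soft set: rows = objects, columns = parameters (1 = "has attribute").
        F = rng.integers(0, 2, size=(8, 10))

        def ranking(mask):
            """Ranking of objects by choice value, using only the kept parameters."""
            scores = F[:, mask == 1].sum(axis=1)
            return tuple(np.argsort(-scores, kind="stable"))

        full_ranking = ranking(np.ones(F.shape[1], dtype=int))

        def fitness(mask):
            """Reward removing parameters, but only if the decision ranking is preserved."""
            if mask.sum() == 0 or ranking(mask) != full_ranking:
                return -1.0
            return (mask.size - mask.sum()) / mask.size   # fraction of parameters removed

        # Standard binary PSO loop with a sigmoid transfer function.
        n_particles, n_iter, dim = 20, 100, F.shape[1]
        X = rng.integers(0, 2, size=(n_particles, dim))
        V = rng.normal(0, 1, size=(n_particles, dim))
        pbest, pbest_fit = X.copy(), np.array([fitness(x) for x in X])
        gbest = pbest[pbest_fit.argmax()].copy()

        for _ in range(n_iter):
            r1, r2 = rng.random((2, n_particles, dim))
            V = 0.7 * V + 1.5 * r1 * (pbest - X) + 1.5 * r2 * (gbest - X)
            X = (rng.random((n_particles, dim)) < 1 / (1 + np.exp(-V))).astype(int)
            fits = np.array([fitness(x) for x in X])
            improved = fits > pbest_fit
            pbest[improved], pbest_fit[improved] = X[improved], fits[improved]
            gbest = pbest[pbest_fit.argmax()].copy()

        print("kept parameters:", np.flatnonzero(gbest), "fitness:", fitness(gbest))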